

Search for: All records

Creators/Authors contains: "Bursik, Marcus I."


  1. Abstract

     Tephra is a unique volcanic product with an unparalleled role in understanding past eruptions, long-term behavior of volcanoes, and the effects of volcanism on climate and the environment. Tephra deposits also provide spatially widespread, high-resolution time-stratigraphic markers across a range of sedimentary settings and thus are used in numerous disciplines (e.g., volcanology, climate science, archaeology). Nonetheless, the study of tephra deposits is challenged by a lack of standardization that inhibits data integration across geographic regions and disciplines. We present comprehensive recommendations for tephra data gathering and reporting that were developed by the tephra science community to guide future investigators and to ensure that sufficient data are gathered for interoperability. Recommendations include standardized field and laboratory data collection, reporting, and correlation guidance. These are organized as tabulated lists of key metadata with their definition and purpose. They are system independent and usable for template, tool, and database development. This standardized framework promotes consistent documentation and archiving, fosters interdisciplinary communication, and improves effectiveness of data sharing among diverse communities of researchers.
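To make the idea of a system-independent, tabulated metadata template concrete, here is a minimal sketch of one tephra-sample record with a completeness check. All field names and values are hypothetical placeholders for illustration, not the community-endorsed vocabulary from the recommendations.

```python
# Illustrative sketch of a system-independent metadata record for one tephra
# sample. Every field name here is a hypothetical placeholder, not the
# standardized vocabulary defined by the community recommendations.
tephra_sample = {
    "sample_id": "TB-2021-014",  # unique identifier (hypothetical)
    "location": {"lat": 19.51, "lon": -103.62, "datum": "WGS84"},
    "stratigraphic_unit": "Unit B",
    "thickness_cm": 12.5,
    "grain_size_max_mm": 3.0,
    "analysis": {
        "method": "EPMA",        # e.g., electron probe microanalysis
        "SiO2_wt_pct": 72.4,
        "normalized": True,
    },
}

def missing_fields(record, required=("sample_id", "location", "stratigraphic_unit")):
    """Return the required top-level fields absent from a metadata record."""
    return [f for f in required if f not in record]

print(missing_fields(tephra_sample))  # []
```

A template like this could back a data-entry tool or a database schema, which is the kind of reuse the system-independent tabulated lists are meant to enable.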
  2. Abstract

     We detail a new prediction-oriented procedure aimed at volcanic hazard assessment based on geophysical mass flow models constrained with heterogeneous and poorly defined data. Our method relies on an itemized application of the empirical falsification principle over an arbitrarily wide envelope of possible input conditions. We thus provide a first step towards an objective and partially automated experimental design construction. In particular, instead of fully calibrating model inputs on past observations, we create and explore more general requirements of consistency, and then we separately use each piece of empirical data to remove those input values that are not compatible with it. Hence, partial solutions are defined to the inverse problem. This has several advantages compared to a traditionally posed inverse problem: (i) the potentially nonempty inverse images of partial solutions of multiple possible forward models characterize the solutions to the inverse problem; (ii) the partial solutions can provide hazard estimates under weaker constraints, potentially including extreme cases that are important for hazard analysis; (iii) if multiple models are applicable, specific performance scores against each piece of empirical information can be calculated. We apply our procedure to the case study of the Atenquique volcaniclastic debris flow, which occurred on the flanks of Nevado de Colima volcano (Mexico) in 1955. We adopt and compare three depth-averaged models currently implemented in the TITAN2D solver, available from https://vhub.org (Version 4.0.0, last access: 23 June 2016). The associated inverse problem is not well-posed if approached in a traditional way. We show that our procedure can extract valuable information for hazard assessment, allowing the exploration of the impact of synthetic flows that are similar to those that occurred in the past but different in plausible ways. The implementation of multiple models is thus a crucial aspect of our approach, as they can allow the covering of other plausible flows. We also observe that model selection is inherently linked to the inversion problem.
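The itemized falsification idea (use each piece of empirical data separately to remove incompatible inputs, yielding partial solutions whose intersection survives all tests) can be sketched with a toy forward model. This is a minimal illustration of the logic only: the model, parameters, and observation bounds below are invented stand-ins, not the TITAN2D workflow or the Atenquique data.

```python
import random

def toy_flow_model(volume, friction):
    """Hypothetical forward model: predicted runout and peak flow depth."""
    return {"runout_km": 10.0 * volume / friction,
            "depth_m": 0.5 * volume * friction}

# Each observation is a plausibility interval; inputs whose prediction falls
# outside it are falsified by that piece of data alone. Bounds are invented.
observations = {
    "runout_km": (4.0, 12.0),
    "depth_m": (0.5, 3.0),
}

# Arbitrarily wide envelope of possible input conditions.
random.seed(0)
envelope = [(random.uniform(0.1, 2.0), random.uniform(0.2, 1.5))
            for _ in range(5000)]

# Partial solutions: inputs compatible with one observation at a time.
partial = {key: {x for x in envelope
                 if lo <= toy_flow_model(*x)[key] <= hi}
           for key, (lo, hi) in observations.items()}

# Inputs surviving every falsification test (intersection of partial solutions).
consistent = set.intersection(*partial.values())
print(len(consistent), "of", len(envelope), "inputs survive all tests")
```

Because each datum is applied separately, the per-observation sets `partial` remain available for weaker-constraint hazard estimates and for scoring competing forward models against individual pieces of empirical information.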

  3. Abstract

    Ideally, probabilistic hazard assessments combine available knowledge about physical mechanisms of the hazard, data on past hazards, and any precursor information. Systematically assessing the probability of rare, yet catastrophic hazards adds a layer of difficulty due to limited observation data. Via computer models, one can exercise potentially dangerous scenarios that may not have happened in the past but are probabilistically consistent with the aleatoric nature of previous volcanic behavior in the record. Traditional Monte Carlo‐based methods to calculate such hazard probabilities suffer from two issues: they are computationally expensive, and they are static. In light of new information (newly available data, signs of unrest, or a new probabilistic analysis describing uncertainty about scenarios), the Monte Carlo calculation would need to be redone under the same computational constraints. Here we present an alternative approach utilizing statistical emulators that provide an efficient way to overcome the computational bottleneck of typical Monte Carlo approaches. Moreover, this approach is independent of an aleatoric scenario model and yet can be applied rapidly to any scenario model, making it dynamic. We present and apply this emulator‐based approach to create multiple probabilistic hazard maps for inundation of pyroclastic density currents in the Long Valley Volcanic Region. Further, we illustrate how this approach enables an exploration of the impact of epistemic uncertainties on these probabilistic hazard forecasts. Particularly, we focus on the uncertainty of vent opening models and how that uncertainty, both aleatoric and epistemic, impacts the resulting probabilistic hazard maps of pyroclastic density current inundation.
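The emulator idea (run the expensive simulator only at a small set of design points, fit a cheap surrogate, then compute hazard probabilities under any scenario model without new simulator runs) can be sketched as follows. This is a deliberately simplified piecewise-linear surrogate with an invented one-input "simulator", not the statistical emulator or PDC model used in the study.

```python
import random
import bisect

def expensive_simulator(volume):
    """Stand-in for a costly flow simulation: inundation distance (km)."""
    return 2.0 * volume ** 0.5

# A modest number of expensive design runs...
design = [0.1 * i for i in range(1, 51)]
responses = [expensive_simulator(v) for v in design]

def emulator(volume):
    """Cheap piecewise-linear surrogate built on the design runs."""
    i = bisect.bisect_left(design, volume)
    if i == 0:
        return responses[0]
    if i >= len(design):
        return responses[-1]
    v0, v1 = design[i - 1], design[i]
    r0, r1 = responses[i - 1], responses[i]
    return r0 + (r1 - r0) * (volume - v0) / (v1 - v0)

def inundation_probability(scenario_sampler, threshold_km, n=20000):
    """P(inundation distance > threshold) under a given scenario model,
    evaluated on the emulator instead of the simulator."""
    hits = sum(emulator(scenario_sampler()) > threshold_km for _ in range(n))
    return hits / n

random.seed(1)
# Swapping in a different scenario model requires no new simulator runs,
# which is what makes the approach dynamic.
p = inundation_probability(lambda: random.uniform(0.5, 4.0), threshold_km=3.0)
```

In the same spirit, re-running `inundation_probability` with several alternative vent opening or scenario distributions yields a family of probability maps whose spread reflects the epistemic uncertainty being explored.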
